
fix: using te and fsdp leads to multiple device found error #1453

Merged
t-vi merged 1 commit into Lightning-AI:main from kshitij12345:fix-te-fsdp-device-mismatch
Nov 19, 2024

Conversation

@kshitij12345
Collaborator

Fixes the error below.

As reported by the Mixology team, running

NVFUSER_DISABLE=multidevice torchrun --standalone --max-restarts=0 --no-python --nproc-per-node=8 python /opt/pytorch/lightning-thunder/thunder/benchmarks/benchmark_litgpt.py --model_name Llama-3-8B --distributed_mode fsdp --shard_mode zero2 --compile inductor --checkpoint_activations False --low_precision_mode fp8-delayed-te --micro_batch_size 1 --bucketing_mode block

produces the error:

RuntimeError: FSDP only supports single device modules but got params on {device(type='cuda', index=2), device(type='meta')}
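To make the failure mode concrete, here is a minimal sketch (not the PR's actual fix) of the invariant FSDP enforces: it collects the set of devices across a module's parameters and rejects modules whose parameters span more than one device. In the TE + FSDP case above, some parameters were still on the `meta` device while others had been materialized on CUDA. The helper `param_devices` and the module `MixedDeviceModule` below are hypothetical names for illustration, using CPU in place of CUDA so the sketch runs anywhere:

```python
import torch
import torch.nn as nn

def param_devices(module: nn.Module) -> set:
    # Mirror the check FSDP performs: gather the distinct devices
    # that this module's parameters live on.
    return {p.device for p in module.parameters()}

class MixedDeviceModule(nn.Module):
    def __init__(self):
        super().__init__()
        self.a = nn.Linear(4, 4)        # materialized (CPU here, CUDA in the report)
        with torch.device("meta"):
            self.b = nn.Linear(4, 4)    # never materialized: still on the meta device

m = MixedDeviceModule()
devices = param_devices(m)
# More than one device -> FSDP raises:
# "FSDP only supports single device modules but got params on {...}"
assert len(devices) > 1
```

The fix is to ensure every parameter is materialized on the same device before wrapping the module with FSDP.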

@kshitij12345 kshitij12345 changed the title fix : te and fsdp leading multiple device found error fix: using te and fsdp leads to multiple device found error Nov 19, 2024
@kshitij12345 kshitij12345 marked this pull request as ready for review November 19, 2024 10:36
Collaborator

@t-vi t-vi left a comment


Thank you @kshitij12345

@t-vi t-vi enabled auto-merge (squash) November 19, 2024 10:58
@t-vi t-vi merged commit f206afa into Lightning-AI:main Nov 19, 2024
@kshitij12345 kshitij12345 deleted the fix-te-fsdp-device-mismatch branch November 19, 2024 11:24

2 participants